
    Discussion of ``2004 IMS Medallion Lecture: Local Rademacher complexities and oracle inequalities in risk minimization'' by V. Koltchinskii

    [arXiv:0708.0083] Comment: Published at http://dx.doi.org/10.1214/009053606000001073 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).

    On non-asymptotic bounds for estimation in generalized linear models with highly correlated design

    We study a high-dimensional generalized linear model and penalized empirical risk minimization with an $\ell_1$ penalty. Our aim is to provide a non-trivial illustration that non-asymptotic bounds for the estimator can be obtained without relying on the chaining technique and/or the peeling device. Comment: Published at http://dx.doi.org/10.1214/074921707000000319 in the IMS Lecture Notes Monograph Series (http://www.imstat.org/publications/lecnotes.htm) by the Institute of Mathematical Statistics (http://www.imstat.org).
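
    The estimator class in this abstract is the $\ell_1$-penalized empirical risk minimizer for a generalized linear model. The snippet below is a minimal illustrative sketch, not the authors' code: it fits an $\ell_1$-penalized logistic regression (logistic loss plus $\ell_1$ penalty) on a synthetic, highly correlated high-dimensional design using scikit-learn. The data-generating process and the penalty level are assumptions made only for the example.

        # Minimal sketch (illustrative assumption, not from the paper):
        # l1-penalized empirical risk minimization for a logistic-link GLM.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n, p = 200, 500                                      # high-dimensional setting: p > n
        X = rng.standard_normal((n, p))
        X[:, 1] = X[:, 0] + 0.05 * rng.standard_normal(n)    # highly correlated design columns
        beta0 = np.zeros(p)
        beta0[:5] = 1.0                                      # sparse true coefficient vector
        y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta0)))

        # penalty="l1" gives logistic loss + l1 penalty; C is the inverse penalty
        # level and would normally be tuned (e.g. by cross-validation).
        fit = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
        print("selected coefficients:", int(np.sum(fit.coef_ != 0)))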

    $\chi^2$-confidence sets in high-dimensional regression

    We study a high-dimensional regression model. Our aim is to construct a confidence set for a given group of regression coefficients, treating all other regression coefficients as nuisance parameters. We apply a one-step procedure with the square-root Lasso as initial estimator and a multivariate square-root Lasso for constructing a surrogate Fisher information matrix. The multivariate square-root Lasso is based on nuclear norm loss with an $\ell_1$-penalty. We show that this procedure leads to an asymptotically $\chi^2$-distributed pivot, with a remainder term depending only on the $\ell_1$-error of the initial estimator. We show that under $\ell_1$-sparsity conditions on the regression coefficients $\beta^0$ the square-root Lasso produces a consistent estimator of the noise variance, and we establish sharp oracle inequalities showing that the remainder term is small under further sparsity conditions on $\beta^0$ and compatibility conditions on the design. Comment: 22 pages
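
    The procedure starts from a square-root Lasso fit, whose penalty level can be chosen without knowing the noise variance; this is why it also yields an estimate of the noise level. The snippet below is a minimal sketch of that initial estimator only, written as a convex program in cvxpy. The synthetic data, the penalty level, and the thresholding rule are illustrative assumptions, and the one-step de-biasing and the multivariate square-root Lasso steps from the abstract are not shown.

        # Minimal sketch (illustrative assumption, not the authors' code):
        # the square-root Lasso, min_b ||y - X b||_2 / sqrt(n) + lam * ||b||_1.
        import numpy as np
        import cvxpy as cp

        rng = np.random.default_rng(1)
        n, p = 100, 300
        X = rng.standard_normal((n, p))
        beta0 = np.zeros(p)
        beta0[:4] = 2.0                                      # sparse true coefficients
        y = X @ beta0 + rng.standard_normal(n)

        # A common theoretical choice of penalty level; it does not involve the
        # unknown noise standard deviation.
        lam = 1.1 * np.sqrt(2.0 * np.log(p) / n)

        b = cp.Variable(p)
        # Using the residual norm (not its square), scaled by sqrt(n), is what
        # makes the tuning parameter pivotal with respect to the noise level.
        objective = cp.norm2(y - X @ b) / np.sqrt(n) + lam * cp.norm1(b)
        cp.Problem(cp.Minimize(objective)).solve()

        beta_hat = b.value
        sigma_hat = np.linalg.norm(y - X @ beta_hat) / np.sqrt(n)   # implied noise-level estimate
        print("nonzeros:", int(np.sum(np.abs(beta_hat) > 1e-6)), "sigma_hat:", round(sigma_hat, 3))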